
    Development and evaluation of a Hadamard transform imaging spectrometer and a Hadamard transform thermal imager

    A spectrometric imager and a thermal imager, which achieve multiplexing through binary optical encoding masks, were developed. The masks are based on orthogonal, pseudorandom digital codes derived from Hadamard matrices. Spatial and/or spectral data are obtained in the form of a Hadamard transform of the scene; computer algorithms then decode the data and reconstruct images of the original scene. The hardware, algorithms, and processing/display facility are described, and a number of spatial and spatial/spectral images are presented. A signal-to-noise improvement due to the signal multiplexing was also demonstrated. An analysis of the results indicates both the situations in which the multiplex advantage may be gained and the limitations of the technique. A number of potential applications of the spectrometric imager are discussed.
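    A minimal sketch of the encoding/decoding principle may help. It assumes a Sylvester-type S-matrix mask, a synthetic scene, and additive detector noise; none of these specifics come from the abstract:

```python
# Minimal sketch of Hadamard-transform multiplexed measurement and decoding.
# Illustrative only: the mask size, scene, and noise level are assumptions,
# not parameters of the instrument described above.
import numpy as np
from scipy.linalg import hadamard

n = 31                              # S-matrix order (n + 1 must be a power of 2)
H = hadamard(n + 1)                 # +/-1 Sylvester Hadamard matrix
S = (1 - H[1:, 1:]) // 2            # binary 0/1 encoding mask (S-matrix)

rng = np.random.default_rng(0)
x = rng.random(n)                   # unknown spectral/spatial scene
noise = 0.01 * rng.standard_normal(n)

y = S @ x + noise                   # each measurement sums ~half the scene elements
x_hat = np.linalg.solve(S, y)       # decode: invert the known mask matrix

print(np.max(np.abs(x_hat - x)))    # reconstruction error is at the noise level
```

    In a detector-noise-limited system, each mask row collects roughly half of the scene elements in a single measurement, which is the source of the multiplex (Fellgett) advantage the abstract demonstrates.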

    Chloroquine, the Coronavirus Crisis, and Neurodegeneration: A Perspective

    At the onset of the ongoing coronavirus pandemic, in vitro data suggested that chloroquine and its analog hydroxychloroquine may be useful in controlling SARS-CoV-2 infection. Efforts are ongoing to test this hypothesis in clinical trials. Some studies demonstrated no evidence of efficacy, and in some cases results were retracted after being reported. Despite the lack of scientific validation, support for the use of these compounds continues from various influencers. At the cellular level, the lysosomotropic drug chloroquine accumulates in acidic organelles, where it acts as an alkalizing agent with possible downstream effects on several cellular pathways. In this perspective, we discuss a possible modulatory role of these drugs in two shared features of neurodegenerative diseases: the cellular accumulation of aberrantly folded proteins and the contribution of neuroinflammation to this pathogenic process. Certainly, the decision on the use of chloroquine must be determined by its efficacy in the specific clinical situation. However, at an unprecedented time of potentially widespread use of chloroquine, we seek to raise awareness of its potential impact on ongoing clinical trials evaluating disease-modifying therapies in neurodegeneration.

    ANS hard X-ray experiment development program

    The hard X-ray (HXX) experiment is one of three experiments on the Dutch Astronomical Netherlands Satellite, which was launched into orbit on 30 August 1974. The overall objective of the HXX experiment is the detailed study of the emission from known X-ray sources over the energy range 1.5-30 keV. The instrument is capable of the following measurements: (1) spectral content over the full energy range, with an energy resolution of approximately 20% and time resolution down to 4 seconds; (2) source time variability down to 4 milliseconds; (3) silicon emission lines at 1.86 and 2.00 keV; (4) source location to a limit of one arc minute in ecliptic latitude; and (5) spatial structure with angular resolution on the order of arc minutes. Scientific aspects of the experiment, its engineering design and implementation, and the program history are included.

    K0-Sigma+ Photoproduction with SAPHIR

    Preliminary results of the analysis of the reaction p(gamma,K0)Sigma+ are presented. We show the first measurement of the differential cross section and substantially improved data for the total cross section compared with previous measurements. The data are compared with predictions from different isobar and quark models that give a good description of p(gamma,K+)Lambda and p(gamma,K+)Sigma0 data in the same energy range. Chiral perturbation theory (ChPT) describes the data adequately at threshold, while isobar models that include hadronic form factors reproduce the data at intermediate energies. Comment: 4 pages, LaTeX2e, 4 postscript figures. Talk given at the International Conference on Hypernuclear and Strange Particle Physics (HYP97), Brookhaven National Laboratory, USA, October 13-18, 1997. To be published in Nucl. Phys. A. Revised version due to changes in experimental data.

    Whole‐brain microscopy reveals distinct temporal and spatial efficacy of anti‐Aβ therapies

    Many efforts targeting amyloid-β (Aβ) plaques for the treatment of Alzheimer's disease have thus far failed in clinical trials. Regional and temporal heterogeneity of efficacy and dependence on plaque maturity may have contributed to these disappointing outcomes. In this study, we mapped the regional and temporal specificity of various anti-Aβ treatments through high-resolution light-sheet imaging of electrophoretically cleared brains. We assessed the effect on amyloid plaque formation and growth in Thy1-APP/PS1 mice subjected to β-secretase inhibitors, polythiophenes, or anti-Aβ antibodies. Each treatment showed a unique spatiotemporal pattern of Aβ clearance, with polythiophenes emerging as a potent anti-Aβ compound. Furthermore, alignment with a spatial-transcriptomic atlas revealed transcripts that correlate with the efficacy of each Aβ therapy. The striking dependence of specific treatments on the location and maturity of Aβ plaques observed in this study may also have contributed to the clinical trial failures of Aβ therapies, suggesting that combinatorial regimens may be significantly more effective in clearing amyloid deposition. Keywords: Alzheimer's disease; amyloid-beta; brain; light-sheet microscopy; tissue clearing.

    Computational modeling of beam-customization devices for heavy-charged-particle radiotherapy

    A model for beam customization with collimators and a range-compensating filter, based on the phase-space theory of beam transport, is presented for dose-distribution calculation in treatment planning of radiotherapy with protons and heavier ions. Independent handling of pencil beams in conventional pencil-beam algorithms causes an unphysical collimator-height dependence in the middle of large fields; this is resolved by a framework comprising generation, transport, collimation, regeneration, range-compensation, and edge-sharpening processes applied to a matrix of pencil beams. The model was verified to be consistent with measurement and analytic estimation at a submillimeter level in the penumbra of individual collimators with a combinationally collimated carbon-ion beam. The model computation is fast, accurate, and readily applicable to pencil-beam algorithms in treatment planning, with the capability of combinational collimation to make the best use of the beam-customization devices. Comment: 16 pages, 5 figures.
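    The role of the phase-space treatment can be illustrated with a toy sampled analogue. The sketch below, with assumed beam parameters, drift distances, and aperture width, shows how a drift after collimation re-blurs the edge that the aperture cut, an effect the beam model must capture jointly across the field rather than per pencil beam:

```python
# Toy sketch of phase-space beam transport and collimation in 1D (x, theta).
# A sampled analogue of the analytic model above; all numbers are assumptions.
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
x = rng.normal(0.0, 5.0, N)          # lateral position at the nozzle [mm]
theta = rng.normal(0.0, 3e-3, N)     # angular divergence [rad]

def drift(x, theta, dz):
    """Free drift: positions shear with angle, angles are unchanged."""
    return x + dz * theta, theta

def collimate(x, theta, half_aperture):
    """Remove particles stopped by the collimator edges."""
    keep = np.abs(x) <= half_aperture
    return x[keep], theta[keep]

x, theta = drift(x, theta, 500.0)        # nozzle -> collimator [mm]
x, theta = collimate(x, theta, 10.0)     # 20 mm wide aperture
x, theta = drift(x, theta, 300.0)        # collimator -> patient surface

# The penumbra width grows with the collimator-to-surface drift because the
# surviving angular spread re-blurs the sharp edge cut by the aperture.
print("sigma_x at surface:", x.std())
```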

    A GPU implementation of a track-repeating algorithm for proton radiotherapy dose calculations

    An essential component of proton radiotherapy is the algorithm that calculates the radiation dose to be delivered to the patient. The most common dose algorithms are fast but rely on approximate analytical approaches, and their accuracy is not always satisfactory, especially in heterogeneous anatomic areas such as the thorax. Monte Carlo techniques provide superior accuracy, but they often require large computational resources, which renders them impractical for routine clinical use. Track-repeating algorithms, such as the Fast Dose Calculator, have shown promise for achieving the accuracy of Monte Carlo simulations for proton radiotherapy dose calculations in a fraction of the computation time. We report on an implementation of the Fast Dose Calculator for proton radiotherapy on graphics processing units (GPUs) rather than a central processing unit (CPU) architecture. This implementation reproduces the full Monte Carlo and CPU-based track-repeating dose calculations to within 2%, while achieving a statistical uncertainty of 2% in less than one minute on a single GPU card, which should allow accurate real-time dose calculations.
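    The core track-repeating idea can be sketched in one dimension: a particle history precomputed in water is replayed in the patient, stretching or shrinking each step by the local relative stopping power. All step lengths, energy deposits, and material values below are illustrative assumptions; the Fast Dose Calculator itself handles full 3D transport, scattering, and secondaries:

```python
# Toy 1D sketch of a track-repeating dose calculation: replay a step-by-step
# proton history precomputed in water, rescaling each step length by the local
# relative stopping power. All numbers are illustrative, not from the paper.
import numpy as np

# Precomputed water track: (step length in water [mm], energy deposited [MeV]).
water_track = [(10.0, 0.8), (10.0, 0.9), (10.0, 1.1), (5.0, 1.5), (2.0, 3.0)]

voxel_size = 1.0                         # mm
rel_stopping_power = np.ones(60)         # water-equivalent patient in 1D
rel_stopping_power[20:35] = 0.3          # assumed low-density (lung-like) slab

dose = np.zeros_like(rel_stopping_power)
z = 0.0
for step_len, edep in water_track:
    i = min(int(z / voxel_size), len(dose) - 1)
    # In a less-stopping medium the same water step stretches out.
    local_len = step_len / rel_stopping_power[i]
    dose[i] += edep                      # crude: deposit the step's energy locally
    z += local_len

# The low-density slab pushes the Bragg peak deeper than in pure water.
print("Bragg-peak voxel:", dose.argmax())
```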

    Measurement of the Bottom-Strange Meson Mixing Phase in the Full CDF Data Set

    We report a measurement of the bottom-strange meson mixing phase \beta_s using the time evolution of B0_s -> J/\psi (-> \mu+ \mu-) \phi (-> K+ K-) decays in which the quark-flavor content of the bottom-strange meson is identified at production. This measurement uses the full data set of proton-antiproton collisions at sqrt(s) = 1.96 TeV collected by the Collider Detector experiment at the Fermilab Tevatron, corresponding to 9.6 fb^-1 of integrated luminosity. We report confidence regions in the two-dimensional space of \beta_s and the B0_s decay-width difference \Delta\Gamma_s, and measure \beta_s in [-\pi/2, -1.51] U [-0.06, 0.30] U [1.26, \pi/2] at the 68% confidence level, in agreement with the standard model expectation. Assuming the standard model value of \beta_s, we also determine \Delta\Gamma_s = 0.068 +- 0.026 (stat) +- 0.009 (syst) ps^-1 and the mean B0_s lifetime, \tau_s = 1.528 +- 0.019 (stat) +- 0.009 (syst) ps, which are consistent and competitive with determinations by other experiments. Comment: 8 pages, 2 figures, Phys. Rev. Lett. 109, 171802 (2012).

    First steps towards a fast-neutron therapy planning program

    Background: The Monte Carlo code GEANT4 was used to implement first steps towards a treatment planning program for fast-neutron therapy at the FRM II research reactor in Garching, Germany. Depth dose curves were calculated inside a water phantom using measured primary neutron and simulated primary photon spectra and compared with depth dose curves measured earlier. The calculations were performed with GEANT4 in two different ways: simulating a simple box geometry, and splitting this box into millions of small voxels (this was done to validate the voxelisation procedure that was also used to voxelise the human body).

    Results: In both cases, the dose distributions were very similar to those measured in the water phantom, up to a depth of 30 cm. In order to model the situation of patients treated at the FRM II MEDAPP therapy beamline for salivary gland tumors, a human voxel phantom was implemented in GEANT4 and irradiated with the implemented MEDAPP neutron and photon spectra. The 3D dose distribution calculated inside the head of the phantom was similar to the depth dose curves in the water phantom, with some differences that are explained by differences in elementary composition. The lateral dose distribution was studied at various depths. The calculated cumulative dose volume histograms for the voxel phantom show the exposure of organs at risk surrounding the tumor.

    Conclusions: In order to minimize the dose to healthy tissue, a conformal treatment is necessary. This can only be accomplished with the help of an advanced treatment planning system like the one developed here. Although all calculations were done for absorbed dose only, any biological dose weighting can be implemented easily, to take into account the increased radiobiological effectiveness of neutrons compared to photons.
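    As a small illustration of the dose-volume histogram analysis mentioned above, the following sketch computes a cumulative DVH from a 3D dose grid and an organ-at-risk mask; the dose values and mask are synthetic placeholders, not MEDAPP results:

```python
# Minimal sketch of a cumulative dose-volume histogram (DVH) from a 3D dose
# grid and an organ mask, as used to assess organs at risk. The dose values
# and the mask below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(2)
dose = rng.gamma(2.0, 1.0, size=(40, 40, 40))   # synthetic 3D dose grid [Gy]
organ = np.zeros(dose.shape, dtype=bool)
organ[10:20, 10:20, 10:20] = True               # synthetic organ-at-risk mask

d = dose[organ]                                 # doses in the organ's voxels
levels = np.linspace(0.0, d.max(), 100)
# Cumulative DVH: fraction of the organ volume receiving at least dose D.
dvh = np.array([(d >= lvl).mean() for lvl in levels])

print("V(2 Gy) =", (d >= 2.0).mean())           # fraction receiving >= 2 Gy
print("DVH spans", dvh[0], "to", dvh[-1])       # full volume down to hottest voxel
```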

    Theoretical analysis of the dose dependence of the oxygen enhancement ratio and its relevance for clinical applications

    Background: The increased resistance of hypoxic cells to ionizing radiation is usually believed to be the primary reason for treatment failure in tumors with oxygen-deficient areas. This oxygen effect can be expressed quantitatively by the oxygen enhancement ratio (OER). Here we investigate theoretically the dependence of the OER on the applied local dose for different types of ionizing radiation and discuss its importance for clinical applications in radiotherapy in two scenarios: small dose variations during hypoxia-based dose painting, and larger dose changes introduced by altered fractionation schemes.

    Methods: Using the widespread Alper-Howard-Flanders and standard linear-quadratic (LQ) models, OER calculations are performed for T1 human kidney and V79 Chinese hamster cells at various dose levels and various hypoxic oxygen partial pressures (pO2) between 0.01 and 20 mmHg, as present in clinical situations in vivo. Our work comprises analyses both of low linear energy transfer (LET) treatment with photons or protons and of high-LET treatment with heavy ions. A detailed analysis of experimental data from the literature with respect to the dose dependence of the oxygen effect is performed, revealing conflicting opinions on whether the OER increases, decreases, or stays constant with dose.

    Results: The behavior of the OER with dose per fraction depends primarily on the ratios of the LQ parameters alpha and beta under hypoxic and aerobic conditions, which themselves depend on LET, pO2, and the cell or tissue type. According to our calculations, OER variations with dose in vivo for low-LET treatments are moderate, with changes in the OER of up to 11% for dose painting (1 or 3 Gy per fraction compared with 2 Gy) and up to 22% for hyper-/hypofractionation (0.5 or 20 Gy per fraction compared with 2 Gy) for oxygen tensions between 0.2 and 20 mmHg, as typically measured clinically in hypoxic tumors. For extremely hypoxic cells (0.01 mmHg), the dose dependence of the OER becomes more pronounced (up to 36%). For high LET, OER variations of only up to 4% were found over the whole range of oxygen tensions between 0.01 and 20 mmHg, much smaller than for low LET.

    Conclusions: The formalism presented in this paper can be used for various tissue and radiation types to estimate OER variations with dose and can help decide in clinical practice whether dose changes in dose painting or in fractionation would bring more benefit in terms of the OER in the treatment of a specific hypoxic tumor.
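    The dose dependence described in the Results section can be made concrete with the standard LQ iso-effect calculation: for an aerobic dose per fraction d_a, solve alpha_h*d + beta_h*d^2 = alpha_a*d_a + beta_a*d_a^2 for the hypoxic dose d and take OER = d / d_a. The alpha and beta values below are illustrative placeholders, not the paper's fitted parameters:

```python
# Sketch of the OER dose dependence in the linear-quadratic model.
# Iso-effect: alpha_h*d_h + beta_h*d_h^2 = alpha_a*d_a + beta_a*d_a^2,
# solved for d_h via the quadratic formula. Parameter values are illustrative.
import numpy as np

alpha_a, beta_a = 0.30, 0.03      # aerobic LQ parameters [1/Gy], [1/Gy^2]
alpha_h, beta_h = 0.10, 0.01      # hypoxic LQ parameters (radioresistant)

def oer(d_a):
    effect = alpha_a * d_a + beta_a * d_a**2   # aerobic log cell kill
    d_h = (-alpha_h + np.sqrt(alpha_h**2 + 4 * beta_h * effect)) / (2 * beta_h)
    return d_h / d_a

for d in (0.5, 2.0, 20.0):
    print(f"OER at {d:4.1f} Gy per fraction: {oer(d):.2f}")
```

    With these placeholder ratios the OER falls from roughly 2.8 at 0.5 Gy to roughly 1.9 at 20 Gy per fraction; other alpha/beta combinations can make it rise or stay flat with dose, which is precisely the controversy in the literature that the abstract refers to.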